\input ijcai 	% Then CogSciJournal formatting macros
\def\TITL #1{	% title-page macro: set running heads, center the title, begin the opening text
 \titlepage\ninepoint
 \runninglefthead{#1}
 \vfill
 \advcount8
 \runningrighthead{{ }} 
 \section{{ }}
 \eject
 \ctrline{\:=#1}
	\vskip 4pt plus 2pt
	\acpmark{\chd}{\csec}
	\noindent\ninepoint\!
}
\def\sectionskip{\penalty-60\vskip 3pt plus 2pt minus 1pt}
\def\ASEC #1{	% appendix section: start a new page, advance the section counter, label it ``App.''
 \titlepage\ninepoint
 \vfill\eject
 \advcount4
 \gdef\grrh{#1}
 \runningrighthead{#1}
 \section{App. \count4 }
 \sectionskip
 \asecbegin{\count4. #1}
 \setcount5 0
% \setcount9 0
 }
\def\NSECP #1{	% numbered section starting on a new page; resets the subsection counter (count5)
 \titlepage\ninepoint
 \vfill\eject
 \advcount4
 \gdef\grrh{#1}
 \runningrighthead{#1}
 \section{\count4}
 \sectionskip
 \sectionbegin{\count4. #1}
 \setcount5 0
%	 \setcount9 0      FOR COG SCI, DON'T RESTART FOOTNOTE NUMBERS
 }
\hsize 6xgpin \vsize 8xgpin \maxdepth 2pt \parindent 19pt \topbaseline 10pt
\parskip 20pt plus 3pt minus 1pt \lineskip 8pt
\topskip 27pt plus 5pt minus 2pt  \botskip 3pt plus 9pt
\output{\baselineskip 0pt\lineskip0pt	% beginning of output routine, resets skips
	\vjust to 9.5xgpin{         % prepare the full page of this fixed height
 		\vskip 27pt	% no page nos at top of preface material
		\page 		% insert the page contents
		\vfill		 % extra space before the page number
	}			% completion of the \vjust
	\advcount0}		% increase page number by 1 and end output routine
\jpar 100

\TITL{COGNITIVE  ECONOMY}

%  \vskip 9pt

 \ctrline{\:=In a Fluid Task Environment}

 \vskip 17pt


{\:q Douglas B. Lenat, \  Frederick Hayes-Roth, \  and Philip Klahr}


\vskip 1.5xgpin   	% Footnote material for title page


\eightpoint

\parindent 0pt

Dr. Lenat is an assistant professor in
the Computer Science Department, Stanford University,
Stanford, Ca. 94305.
Drs. Hayes-Roth and Klahr are researchers in the Information Sciences
Department of the Rand Corporation, Santa Monica, Ca.

This paper describes work in progress, sponsored by the National
Science Foundation under grants MCS77-04440 and MCS77-03273.  It is
addressed to a technical audience familiar with the concepts of artificial
intelligence, knowledge representation, and knowledge engineering.  
Although it does present some concrete research results,
it is
being disseminated at this time primarily to stimulate discussion and interaction
with colleagues on these issues.

\ninepoint

\vfill

Running Title:  COGNITIVE ECONOMY


\setcount0 1

\vfill\eject

\output{\baselineskip 0pt\lineskip0pt	% beginning of output routine, resets skips
	\vjust to 8.5xgpin{         % prepare the full page of this fixed height
 		\ctrline{\:c --- \count0 --- }
		\vskip 20pt
		\page 		% insert the page contents
		\vfill		 % extra space before the page number
	}			% completion of the \vjust
	\advcount0}		% increase page number by 1 and end output routine

\tenpoint

{\bf Abstract}

\ninepoint

\vskip 1xgpin  		% Here goes the abstract

\noindent Intelligent systems can explore only tiny subsets of their potential
external and conceptual worlds.  To increase their effective
capacities, they must develop efficient forms of representation,
access, and operation.  
If forced to survive in a changing task environment,
most inferential systems could benefit
from
{\it learning}
about that environment and their own behavior.  For example,
they can exploit new schemata or different slot-names to simplify and
restructure their knowledge.  
In this paper we describe a program that automatically extends its
schematized representation of knowledge by defining appropriate new slots
dynamically.  We then review some very general techniques for
increasing the efficiency of programs without sacrificing
expressibility:
caching, abstraction, and
expectation-simplified processing.
It is shown in detail how one of these, caching, regains the efficiency that
otherwise would be lost in adopting the interpretive slot-defining scheme
presented earlier.  

\vskip 11pt

\tenpoint

\vfill

\eject

\NSECP{Introduction}

\parindent 19pt

Computer programs, no less than biological creatures, must perform 
in an environment: an externally imposed set of demands, pressures,
opportunities, regularities.
{\it Cognitive economy} is the degree to which a program is adapted to its
environment, the extent to which
its internal capabilities (structures and processes) accurately
and efficiently reflect
its environmental niche.
If the environment changes frequently and radically, then the program
(or species or organization) should monitor those changes and be capable of
adapting itself to them dynamically.  This is cost-effective when
the penalty for failure to adapt exceeds the sum of the following
costs: (i) the continuous overhead of monitoring to detect changes,
(ii) the one-time cost of providing a mechanism by which self-modification
can occur, and (iii) the occasional expense of employing that mechanism to
respond to a change that has been observed.

\vskip 1.5xgpin

% 	Put the equation here -- in a box, with (1) at margin
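One plausible way to write this break-even condition (the symbols here are
illustrative only, not taken from the text above: $P$ is the penalty for
failing to adapt, and $C_{mon}$, $C_{mech}$, and $C_{resp}$ are the costs
(i), (ii), and (iii) above) is

$$ P \ > \ C_{mon} + C_{mech} + C_{resp} \eqno(1)$$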


There are two directions for exploring this phenomenon:

\noindent $\bullet $ 
Make the preceding equation less mushy.  Investigate what features of the
performer (program), task environment, changes in that environment, etc.,
make the ``monitoring \& self-modification'' process cost effective.
Clearly, in some cases the penalty
for non-adaptation {\it is} high enough; 
as examples spanning several time scales
consider biological evolution (millenia),
the educational system of a culture (years),
the business meetings of a corporation (months),
the immune system of an organism (hours),
and
the nervous system of an organism (milliseconds).
In other cases, the above equation tips {\it against} adaptation as being
cognitively economical; 
to date, almost all computer programs (and cognitive models in general)
have been constructed to deal with a fixed task, or at least with a
fixed task environment [Newell \& Simon 1972].  But the magnitude of the
phenomena we seek to model continues to grow, and we begin to
feel the confines of any single, unchanging model we hypothesize.
In the years ahead, our models --- be they in Lisp, equations, or prose ---
must become increasingly responsive.   In the conclusions of this paper,
we provide some further thoughts on this research direction.

\noindent $\bullet $ 
Operationalize the vague phrase ``monitoring \& self-modification.''
In the case of computer models, how might they select (or {\sl discover})
timely new knowledge, new control algorithms,
new knowledge representations?   What are the difficulties encountered?
In this paper we present some
specific techniques by which such programs can monitor their
runtime environment (Appendix 4) 
and modify themselves to heighten their fitness to it (Section 2).
We describe how one particular program, Eurisko, dynamically defines useful
new slots.

               \SSEC{Extending a Schematized Representation}


{\bf Summary:\  \sl
The Eurisko program extends its schematized representation
by defining new types of
slots. It uses a very simple grammar for defining new slots from old ones (legal
moves), and a corpus of heuristic rules for
guiding and constraining that process (plausible moves).}

\yskip

For a schematized representation,
``extension'' could
occur by defining
new types of slots.  The Eurisko program (an extension of the AM 
program [Lenat 1976]) has this capability, and we shall
briefly describe how this mechanism was developed.
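
To make the idea concrete, here is a rough sketch (in Python, purely for
exposition; Eurisko itself is a Lisp program, and every name below is
invented rather than taken from its code) of how new slots might be
composed from old ones, with a heuristic filter deciding which
compositions are worth making:

# Sketch only: slot composition supplies the "legal moves," a heuristic
# filter the "plausible moves."  All function, slot, and variable names
# are invented for illustration.

DEFINED_SLOTS = {}          # name -> accessor for dynamically defined slots

def get_slot(unit, slot, kb):
    """Return a slot's values for a unit; composed slots are computed on demand."""
    entry = kb.get(unit, {})
    if slot in entry:
        return entry[slot]
    if slot in DEFINED_SLOTS:
        return DEFINED_SLOTS[slot](unit, kb)
    return []

def define_composed_slot(outer, inner):
    """Legal move: define the new slot outer-of-inner, e.g. generalizations-of-examples."""
    name = f"{outer}-of-{inner}"
    def accessor(unit, kb):
        values = []
        for v in get_slot(unit, inner, kb):
            values.extend(get_slot(v, outer, kb))
        return values
    DEFINED_SLOTS[name] = accessor
    return name

def worth_defining(outer, inner, usage_counts, threshold=3):
    """Plausible move: one invented heuristic -- only name a composition
    that has actually been requested several times."""
    return usage_counts.get((outer, inner), 0) >= threshold

For instance, composing the slots {\it generalizations} and {\it examples}
yields a new slot whose values are recomputed interpretively on every
request; the caching discussed later in this paper would store those
values on the unit once computed, recovering the efficiency lost to
this interpretation.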


\vfill

\end